71 research outputs found

    Channel clustering and QoS level identification scheme for multi-channel cognitive radio networks

    The increasing popularity of wireless services and devices creates high bandwidth demands; however, spectrum resources are not only limited but also heavily underutilized. Multiple licensed channels that support the same levels of QoS are desirable for resolving the problems posed by the scarcity and inefficient use of spectrum resources in multi-channel cognitive radio networks (MCRNs), not least because multimedia services and applications have distinct, stringent QoS requirements. However, owing to the lack of coordination between primary and secondary users, identifying the QoS levels supported over available licensed channels remains an open problem that has not previously been attempted. This article presents a novel Bayesian non-parametric channel clustering scheme that identifies the QoS levels supported over available licensed channels. The proposed scheme employs the infinite Gaussian mixture model and a collapsed Gibbs sampler to identify QoS levels from a feature space comprising the bit rate, packet delivery ratio, and packet delay variation of licensed channels. Real measurements of wireless data traces and comparisons with baseline clustering schemes are used to evaluate the performance of the proposed scheme. © 1979-2012 IEEE. *Please note that this article has multiple authors; only the names of the first five, including Federation University Australia affiliate "Muhammad Imran", are provided in this record.*
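    The abstract's core idea, clustering channels by QoS features with a Dirichlet-process (infinite) Gaussian mixture inferred by collapsed Gibbs sampling, can be sketched in a few dozen lines. The sketch below is not the paper's model: it assumes synthetic, standardized channel features, a spherical Gaussian likelihood with known variance, and a zero-mean Gaussian prior on cluster means, so the posterior predictive has a closed form. All parameter values (`alpha`, `sigma2`, `tau2`) are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "channel" features: bit rate, packet delivery ratio, packet
    # delay variation (standardized). Three hypothetical QoS levels.
    X = np.vstack([
        rng.normal([-2, -2, -2], 0.3, size=(30, 3)),
        rng.normal([0, 0, 0], 0.3, size=(30, 3)),
        rng.normal([2, 2, 2], 0.3, size=(30, 3)),
    ])

    alpha = 1.0        # Dirichlet-process concentration (assumed)
    sigma2 = 0.3 ** 2  # known spherical likelihood variance (assumed)
    tau2 = 4.0         # prior variance on cluster means (assumed)

    def log_pred(x, members):
        """Log posterior-predictive density of x for a cluster with the
        given members (Gaussian likelihood, Gaussian prior on the mean)."""
        n = len(members)
        if n == 0:
            mean, var = np.zeros_like(x), tau2 + sigma2
        else:
            post_var = 1.0 / (1.0 / tau2 + n / sigma2)
            mean = post_var * (n / sigma2) * np.mean(members, axis=0)
            var = post_var + sigma2
        d = x - mean
        return -0.5 * np.sum(d * d) / var - 0.5 * len(x) * np.log(2 * np.pi * var)

    z = np.zeros(len(X), dtype=int)   # start with every channel in one cluster
    for _ in range(20):               # collapsed Gibbs sweeps
        for i in range(len(X)):
            z[i] = -1                 # remove point i from its cluster
            labels = [k for k in np.unique(z) if k >= 0]
            logp = [np.log(np.sum(z == k)) + log_pred(X[i], X[z == k])
                    for k in labels]
            logp.append(np.log(alpha) + log_pred(X[i], X[:0]))  # new cluster
            logp = np.array(logp)
            p = np.exp(logp - logp.max())
            p /= p.sum()
            choice = rng.choice(len(p), p=p)
            z[i] = labels[choice] if choice < len(labels) else max(labels) + 1

    print("inferred QoS levels:", len(np.unique(z)))
    ```

    Because the number of mixture components is not fixed in advance, the sampler itself discovers how many QoS levels the feature space supports, which is the property the abstract relies on.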

    Heterogeneity-aware task allocation in mobile ad hoc cloud

    Mobile Ad Hoc Cloud (MAC) enables a multitude of proximate, resource-rich mobile devices to provide computational services in the vicinity. However, ignoring device resources and operational heterogeneity-measuring parameters, such as CPU speed, number of cores, and workload, when allocating tasks in MAC causes inefficient resource utilization that prolongs task execution time and consumes large amounts of energy. The longer execution times and high energy consumption markedly degrade task execution and impede the optimal use of MAC. This paper aims to minimize execution time and energy consumption by proposing heterogeneity-aware task allocation solutions for compute-intensive tasks in MAC. Results reveal that incorporating the heterogeneity-measuring parameters guarantees shorter execution times and reduces the energy consumption of compute-intensive tasks in MAC. A system model is developed to validate the proposed solutions' empirical results. In comparison with random task allocation, the five proposed solutions, based on CPU speed; number of cores; workload; CPU speed and workload; and CPU speed, cores, and workload, reduce execution time by up to 56.72%, 53.12%, 56.97%, 61.23%, and 71.55%, respectively. In addition, these heterogeneity-aware task allocation solutions save up to 69.78%, 69.06%, 68.25%, 67.26%, and 57.33% of energy, respectively. The proposed solutions thus significantly improve task execution performance, which can increase the optimal use of MAC. © 2013 IEEE
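    To make the idea concrete, here is a minimal sketch of a heterogeneity-aware allocator in the spirit of the abstract's combined CPU-speed, cores, and workload solution. It is not the paper's algorithm: the `Device` fields, the idealized capacity model (speed × cores), and the greedy earliest-finish-time rule are all assumptions made for illustration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Device:
        name: str
        cpu_ghz: float        # CPU speed (heterogeneity parameter 1)
        cores: int            # number of cores (heterogeneity parameter 2)
        workload: float = 0.0 # pending work in giga-cycles (parameter 3)

    def finish_time(dev, task_gcycles):
        # Estimated completion time if the task is queued on this device:
        # existing workload plus the new task, over idealized capacity.
        capacity = dev.cpu_ghz * dev.cores   # giga-cycles per second
        return (dev.workload + task_gcycles) / capacity

    def allocate(tasks, devices):
        """Greedy heterogeneity-aware allocation: each task (in giga-cycles)
        goes to the device with the earliest estimated finish time."""
        plan = []
        for t in tasks:
            best = min(devices, key=lambda d: finish_time(d, t))
            best.workload += t           # account for the queued task
            plan.append((t, best.name))
        return plan

    devices = [Device("A", 2.8, 8), Device("B", 1.4, 4), Device("C", 1.2, 2)]
    plan = allocate([5.0, 5.0, 1.0], devices)
    ```

    A random allocator, by contrast, would ignore `finish_time` entirely, which is exactly the inefficiency the abstract attributes to heterogeneity-unaware allocation.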

    6G wireless systems : a vision, architectural elements, and future directions

    Internet of Everything (IoE)-based smart services are expected to gain immense popularity in the future, which raises the need for next-generation wireless networks. Although fifth-generation (5G) networks can support various IoE services, they might not be able to completely fulfill the requirements of novel applications. Sixth-generation (6G) wireless systems are envisioned to overcome these limitations. In this article, we explore recent advances made toward enabling 6G systems. We devise a taxonomy based on key enabling technologies, use cases, emerging machine learning schemes, communication technologies, networking technologies, and computing technologies. Furthermore, we identify and discuss open research challenges, such as artificial-intelligence-based adaptive transceivers, intelligent wireless energy harvesting, decentralized and secure business models, intelligent cell-less architecture, and distributed security models. We propose practical guidelines, including deep Q-learning and federated learning-based transceivers, blockchain-based secure business models, homomorphic encryption, and distributed-ledger-based authentication schemes, to cope with these challenges. Finally, we outline and recommend several future directions. © 2013 IEEE

    Resource optimized federated learning-enabled cognitive internet of things for smart industries

    Leveraging the cognitive Internet of Things (C-IoT), emerging computing technologies, and machine learning schemes can assist industries in streamlining manufacturing processes, revolutionizing operational analytics, and maintaining factory efficiency. However, further adoption of centralized machine learning in industry appears restricted by data privacy issues. Federated learning has the potential to bring predictive features to industrial systems without leaking private information, but its implementation involves key challenges, including resource optimization, robustness, and security. In this article, we propose a novel dispersed federated learning (DFL) framework that provides resource optimization, whereby the distributed fashion of learning offers robustness. We formulate an integer linear optimization problem to minimize the overall federated learning cost of the DFL framework. To solve the formulated problem, we first decompose it into two sub-problems: association and resource allocation. Second, we relax both sub-problems to make them convex optimization problems. We then apply a rounding technique to recover binary association and resource allocation variables. The proposed algorithm works iteratively, fixing one problem variable (for example, association) and computing the other (for example, resource allocation), and continues until the formulated cost optimization problem converges. Furthermore, we compare the proposed DFL with two baseline schemes: random resource allocation and random association. Numerical results show the superiority of the proposed DFL scheme. © 2020 Institute of Electrical and Electronics Engineers Inc. All rights reserved
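    The decompose, relax, and round pattern the abstract describes can be illustrated with a toy association problem. This is only a sketch of the pattern, not the paper's formulation: the cost matrix is random, the "relaxation" is a closed-form inverse-cost normalization standing in for a convex solve, and the capacity-respecting rounding rule is an assumption made for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    C = rng.uniform(1, 10, size=(6, 3))  # cost of client i associating with server j
    cap = np.array([2, 2, 2])            # each server can take two clients

    # Step 1: relaxed (fractional) association. A real implementation would
    # solve the convex relaxation; here a proportional-to-inverse-cost
    # normalization plays that role, giving x[i, j] in [0, 1] summing to 1.
    x = 1.0 / C
    x /= x.sum(axis=1, keepdims=True)

    # Step 2: rounding. Recover binary associations from the fractional
    # solution: handle the most "confident" clients first, and give each
    # client its highest-weight server that still has capacity.
    assign = -np.ones(6, dtype=int)
    load = np.zeros(3, dtype=int)
    for i in np.argsort(-x.max(axis=1)):     # most confident clients first
        for j in np.argsort(-x[i]):          # preferred servers first
            if load[j] < cap[j]:
                assign[i] = j
                load[j] += 1
                break
    ```

    In the full iterative scheme the abstract outlines, this association step would be alternated with a resource allocation step, fixing one variable while optimizing the other, until the overall cost stops decreasing.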

    The role of big data analytics in industrial internet of things

    Big data production in industrial Internet of Things (IIoT) is evident due to the massive deployment of sensors and Internet of Things (IoT) devices. However, big data processing is challenging due to the limited computational, networking and storage resources at the IoT device end. Big data analytics (BDA) is expected to provide operational- and customer-level intelligence in IIoT systems. Although numerous studies on IIoT and BDA exist, only a few have explored the convergence of the two paradigms. In this study, we investigate the recent BDA technologies, algorithms and techniques that can lead to the development of intelligent IIoT systems. We devise a taxonomy by classifying and categorising the literature on the basis of important parameters (e.g. data sources, analytics tools, analytics techniques, requirements, industrial analytics applications and analytics types). We present frameworks and case studies of various enterprises that have benefited from BDA. We also enumerate the considerable opportunities introduced by BDA in IIoT. We identify and discuss the indispensable challenges that remain to be addressed, serving as future research directions. © 2019 Elsevier B.V

    Data Collection in Smart Communities Using Sensor Cloud: Recent Advances, Taxonomy, and Future Research Directions

    The remarkable miniaturization of sensors has led to the production of massive amounts of data in smart communities. These data cannot be efficiently collected and processed in wireless sensor networks (WSNs) due to the weak communication capability of these networks. This drawback can be compensated for by amalgamating WSNs and cloud computing to obtain sensor clouds. In this article, we investigate, highlight, and report recent premier advances in sensor clouds with respect to data collection. We categorize and classify the literature by devising a taxonomy based on important parameters, such as objectives, applications, communication technology, collection types, discovery, data types, and classification. Moreover, a few prominent use cases are presented to highlight the role of sensor clouds in providing high computation capabilities. Furthermore, several open research challenges and issues, such as big data issues, deployment issues, data security, data aggregation, dissemination of control messages, and on-time delivery, are discussed. Future research directions are also provided.

    Trustworthy Blockchain Gateways for Resource-Constrained Clients and IoT Devices

    Constrained blockchain clients cannot process and store the entire blockchain ledger or mine blocks. Such nodes rely on the view of the blockchain provided by full nodes, termed gateways. However, gateway nodes can provide a distorted view of the blockchain, making lightweight clients vulnerable to eclipse attacks. Under such an attack, a client cannot differentiate between a forked view of the blockchain and the legitimate ledger, with potentially severe consequences and substantial losses. To mitigate such threats, we propose a data attestation solution that employs full nodes as validators to attest to the responses reported by the gateways of lightweight nodes. Leveraging smart contracts, our approach gives lightweight clients confidence in the reported data, which they are unable to validate against the blockchain network themselves. The system governs the attestation process, which comprises submitting attestation requests, approving them, recording validator responses, and managing payments. Clients can thereafter provide feedback on validator/gateway performance in the form of a reputation score. We present the proposed system architecture and describe its implementation on the Ethereum blockchain network. We evaluated the proposed solution with respect to functionality testing, cost of execution, and security analysis of the developed smart contracts. We have also made our smart contract code publicly available on GitHub.
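    The attestation lifecycle the abstract describes, submit a request, collect validator responses, settle by comparing them with the gateway's claim, and update reputation, can be sketched off-chain. The sketch below is an illustrative model only, not the authors' Ethereum contracts: the `AttestationRegistry` name, the `quorum` parameter, the majority-vote settlement rule, and the ±1 reputation update are all hypothetical choices, and payment handling is omitted.

    ```python
    from collections import Counter
    from dataclasses import dataclass, field

    @dataclass
    class AttestationRequest:
        client: str
        gateway: str
        claimed_response: str                       # what the gateway reported
        votes: dict = field(default_factory=dict)   # validator -> observed value

    class AttestationRegistry:
        """Off-chain sketch of the attestation lifecycle: submit a request,
        record validator responses, settle by majority, track reputation."""

        def __init__(self, quorum=3):
            self.quorum = quorum            # minimum validator responses
            self.reputation = Counter()     # gateway -> reputation score

        def submit(self, client, gateway, claimed):
            # A lightweight client opens an attestation request for a
            # response it received from its gateway but cannot verify.
            return AttestationRequest(client, gateway, claimed)

        def attest(self, req, validator, observed):
            # A full node reports what it observes on the real chain.
            req.votes[validator] = observed

        def settle(self, req):
            # Compare the majority validator view with the gateway's claim
            # and adjust the gateway's reputation accordingly.
            if len(req.votes) < self.quorum:
                raise ValueError("not enough validator responses")
            majority, _ = Counter(req.votes.values()).most_common(1)[0]
            honest = majority == req.claimed_response
            self.reputation[req.gateway] += 1 if honest else -1
            return honest

    registry = AttestationRegistry(quorum=3)
    req = registry.submit("client1", "gw1", "0xabc")
    for validator in ("v1", "v2", "v3"):
        registry.attest(req, validator, "0xabc")
    registry.settle(req)
    ```

    On-chain, each of these methods would correspond to a smart contract function, so that the request, the validator responses, and the resulting reputation score are all tamper-evident.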